
    An Actor-Centric Approach to Facial Animation Control by Neural Networks For Non-Player Characters in Video Games

    Game developers increasingly consider the degree to which character animation emulates facial expressions found in cinema. Employing animators and actors to produce cinematic facial animation by mixing motion capture and hand-crafted animation is labor-intensive and therefore expensive. Emotion corpora and neural network controllers have shown promise toward developing autonomous animation that does not rely on motion capture. Previous research and practice in the disciplines of Computer Science, Psychology, and the Performing Arts have provided frameworks on which to build a workflow toward creating an emotion AI system that, by deploying a combination of related theories and methods, can animate the facial mesh of a 3D non-player character. However, past investigations and their resulting production methods largely ignore the emotion generation systems that have evolved in the performing arts for more than a century. We find very little research that embraces the intellectual process of trained actors as complex collaborators from whom to understand and model the training of a neural network for character animation. This investigation demonstrates a workflow design that integrates knowledge from the performing arts and the affective branches of the social and biological sciences. Our workflow proceeds from developing and annotating a fictional scenario with actors, to producing a video emotion corpus, to designing, training, and validating a neural network, to analyzing the emotion data annotation of the corpus and the neural network, and finally to determining whether its autonomous animation control of a 3D character facial mesh produces resemblant behavior. The resulting workflow includes a method for developing a neural network architecture whose initial efficacy as a facial emotion expression simulator has been tested and validated as substantially resemblant to the character behavior developed by a human actor.
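
    The paper does not publish code, but the neural-controller stage of the described workflow can be illustrated with a minimal sketch. It assumes, hypothetically, that each corpus frame carries a continuous emotion annotation vector paired with facial-mesh control values (e.g., blendshape weights); the names, dimensions, and training setup below are illustrative placeholders, not the authors' architecture.

```python
import torch
import torch.nn as nn

class FacialControlNet(nn.Module):
    """Hypothetical controller: maps a per-frame emotion annotation vector
    to facial-mesh control weights, producing gradient rather than discrete poses."""
    def __init__(self, n_emotion_dims: int = 8, n_controls: int = 52):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_emotion_dims, 64),
            nn.ReLU(),
            nn.Linear(64, n_controls),
            nn.Sigmoid(),  # control weights normalized to [0, 1]
        )

    def forward(self, emotion_vec: torch.Tensor) -> torch.Tensor:
        return self.net(emotion_vec)

# Training over corpus-derived (annotation, control) pairs.
model = FacialControlNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(emotion_batch: torch.Tensor, control_batch: torch.Tensor) -> float:
    """One gradient step on a batch of annotated frames from the emotion corpus."""
    optimizer.zero_grad()
    loss = loss_fn(model(emotion_batch), control_batch)
    loss.backward()
    optimizer.step()
    return loss.item()
```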

    How Actors Can Animate Game Characters: Integrating Performance Theory in the Emotion Model of a Game Character

    Despite the development of sophisticated emotion models, game character facial animation is still often completed with laborious hand-controlled keyframing, only marginally assisted by automation. Behavior trees and animation state machines are used mostly to manage animation transitions for physical business, such as walking or lifting objects. Attempts at automated facial animation rely on discrete, iconic facial emotion poses, resulting in mechanical “acting”. The techniques of acting instructor and theorist Sanford Meisner reveal a process of role development whose character model resembles components of Appraisal Theory. This similarity motivates an experiment to discover whether an “emotion engine” and workflow method can model the emotions of an autonomous animated character using actor-centric techniques. Success would allow an animation director to autonomously animate a character’s face with the subtlety of gradient expressions. Using a head-shot video stream of one of two actors performing a structured Meisner-esque improvisation as the primary data source, this research demonstrates the viability of an actor-centric workflow for creating an autonomous facial animation system.
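
    The contrast the abstract draws between discrete iconic poses and gradient expressions can be sketched as below. This is not the authors' system; the pose targets, emotion names, and intensity values are hypothetical placeholders used only to show how continuous, appraisal-style intensities blend poses instead of snapping between them.

```python
import numpy as np

N_CONTROLS = 52  # illustrative number of facial control weights

# Each pose target is a vector of facial control weights (stand-in data).
POSE_TARGETS = {
    "neutral": np.zeros(N_CONTROLS),
    "joy":     np.random.rand(N_CONTROLS),
    "anger":   np.random.rand(N_CONTROLS),
}

def discrete_pose(dominant_emotion: str) -> np.ndarray:
    """Mechanical 'acting': snap to the single dominant iconic pose."""
    return POSE_TARGETS[dominant_emotion]

def gradient_pose(emotion_intensities: dict) -> np.ndarray:
    """Blend pose targets by continuous intensities, yielding subtle
    in-between expressions rather than a single iconic pose."""
    blend = np.zeros(N_CONTROLS)
    total = sum(emotion_intensities.values()) or 1.0
    for emotion, intensity in emotion_intensities.items():
        blend += (intensity / total) * POSE_TARGETS[emotion]
    return blend

# e.g. a face that is mostly joyful with a trace of anger
weights = gradient_pose({"neutral": 0.2, "joy": 0.7, "anger": 0.1})
```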